A Truncated Descent HS Conjugate Gradient Method and Its Global Convergence
Authors
Abstract
Similar resources
A Truncated Descent HS Conjugate Gradient Method and Its Global Convergence
Recently, Zhang (2006) proposed a three-term modified HS (TTHS) method for unconstrained optimization problems. An attractive property of the TTHS method is that the direction it generates is always a descent direction; this property is independent of the line search used. To obtain global convergence of the TTHS method, Zhang proposed a truncated TTHS method. A drawback is that the num...
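As an illustration of the descent property described above, the following is a minimal numerical sketch, assuming the standard three-term HS form d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1} with beta_k = g_k^T y_{k-1} / d_{k-1}^T y_{k-1}, theta_k = g_k^T d_{k-1} / d_{k-1}^T y_{k-1}, and y_{k-1} = g_k - g_{k-1}; the paper's truncation rule is not reproduced here. With this choice, g_k^T d_k = -||g_k||^2 regardless of the line search.

import numpy as np

def tths_direction(g, g_prev, d_prev):
    # Three-term HS direction (sketch): d = -g + beta*d_prev - theta*y, with y = g - g_prev.
    # By construction g^T d = -||g||^2, independent of the data and of the line search.
    y = g - g_prev
    denom = d_prev @ y                    # d_{k-1}^T y_{k-1}
    if abs(denom) < 1e-12:                # safeguard: fall back to steepest descent
        return -g
    beta = (g @ y) / denom                # HS conjugate gradient parameter
    theta = (g @ d_prev) / denom          # coefficient of the third term
    return -g + beta * d_prev - theta * y

# Numerical check of the descent identity on random data.
rng = np.random.default_rng(0)
g, g_prev, d_prev = rng.normal(size=(3, 5))
d = tths_direction(g, g_prev, d_prev)
print(np.isclose(g @ d, -(g @ g)))        # True: g^T d = -||g||^2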
A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
In this paper, we propose a modified Polak–Ribière–Polyak (PRP) conjugate gradient method. An attractive property of the proposed method is that the direction it generates is always a descent direction for the objective function; this property is independent of the line search used. Moreover, if an exact line search is used, the method reduces to the ordinary PRP method. Under appropria...
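For context, a commonly used three-term PRP direction with exactly these two properties (an assumed form stated here for illustration; the cited paper may define its direction differently) is

\[
d_0=-g_0,\qquad
d_k=-g_k+\beta_k^{\mathrm{PRP}}d_{k-1}-\theta_k y_{k-1},\qquad
\beta_k^{\mathrm{PRP}}=\frac{g_k^{T}y_{k-1}}{\|g_{k-1}\|^{2}},\quad
\theta_k=\frac{g_k^{T}d_{k-1}}{\|g_{k-1}\|^{2}},\quad
y_{k-1}=g_k-g_{k-1}.
\]

With this choice, g_k^T d_k = -||g_k||^2 for every k (descent regardless of the line search), and an exact line search gives g_k^T d_{k-1} = 0, so theta_k = 0 and the direction collapses to the ordinary PRP direction.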
Global convergence of a modified spectral FR conjugate gradient method
A modified spectral PRP conjugate gradient method is presented for solving unconstrained optimization problems. The constructed search direction is proved to be a sufficient descent direction for the objective function. With an Armijo-type line search to determine the step length, a new spectral PRP conjugate gradient algorithm is developed. Under some mild conditions, the theory of global convergenc...
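The Armijo-type step-length rule mentioned above can be sketched as a standard backtracking loop; this is a generic version with hypothetical names (f, x, d, g), not necessarily the exact acceptance test used in the paper.

import numpy as np

def armijo_step(f, x, d, g, sigma=1e-4, rho=0.5, alpha0=1.0, max_backtracks=50):
    # Generic Armijo backtracking (sketch): accept the largest alpha = alpha0 * rho^m with
    # f(x + alpha*d) <= f(x) + sigma*alpha*g^T d; d is assumed to be a descent direction.
    fx, gtd = f(x), g @ d
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + sigma * alpha * gtd:
            return alpha
        alpha *= rho
    return alpha

# Example: one step along the steepest-descent direction of a convex quadratic.
f = lambda x: 0.5 * (x @ x)
x = np.array([3.0, -4.0])
g = x                                   # gradient of f at x
alpha = armijo_step(f, x, -g, g)
print(alpha, f(x - alpha * g) < f(x))   # the accepted step decreases f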
Global Convergence of a Modified Liu–Storey Conjugate Gradient Method
In this paper, we make a modification to the LS (Liu–Storey) conjugate gradient method and propose a descent LS method. The method generates a sufficient descent direction for the objective function. We prove that the method is globally convergent with an Armijo-type line search. Moreover, under mild conditions, we show that the method is globally convergent if the Armijo line search or the Wolfe line sea...
A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a step-length technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and give further analysis.
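The Zoutendijk condition referred to above is, in its standard form (stated here for context; the paper's exact assumptions may differ): if f is bounded below, its gradient is Lipschitz continuous on the level set, and the step lengths satisfy the Wolfe conditions, then

\[
\sum_{k\ge 0}\frac{\left(g_k^{T}d_k\right)^{2}}{\|d_k\|^{2}}<\infty .
\]

Combined with a sufficient descent condition g_k^T d_k <= -c ||g_k||^2 (c > 0) and a suitable bound on ||d_k||, this forces liminf_{k -> infinity} ||g_k|| = 0, which is the sense of global convergence used in these abstracts.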
Journal
Journal title: Mathematical Problems in Engineering
Year: 2009
ISSN: 1024-123X, 1563-5147
DOI: 10.1155/2009/875097